01. Mean Squared Error Function
Log-Loss vs Mean Squared Error
In the previous section, Luis taught you about the log-loss function. There are many other error functions used for neural networks. Let me teach you another one, called the mean squared error. As the name suggests, this one is the mean of the squares of the differences between the predictions and the labels. In the following section, I'll go over it in detail, and then we'll implement backpropagation with it on the same student admissions dataset.
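Written out, one common form looks like the following, where the symbols are my own notation rather than necessarily the one used later in the lesson: y^μ are the labels, ŷ^μ are the predictions, and m is the number of data points. Some treatments also include an extra factor of 1/2 so the derivative comes out cleaner during backpropagation.

```latex
E = \frac{1}{m} \sum_{\mu=1}^{m} \left( y^{\mu} - \hat{y}^{\mu} \right)^{2}
```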
And as a bonus, we'll implement it in a very efficient way using matrix multiplication with NumPy!
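As a rough sketch of what a vectorized NumPy version can look like, here is a minimal example. The array names `targets` and `predictions` and the sample values are placeholders of my own, not taken from the student admissions code that follows.

```python
import numpy as np

def mean_squared_error(targets, predictions):
    """Mean of the squared differences between labels and predictions."""
    errors = targets - predictions      # element-wise differences
    return np.mean(errors ** 2)         # average of their squares

# Tiny example with made-up numbers
targets = np.array([1.0, 0.0, 1.0, 1.0])
predictions = np.array([0.9, 0.2, 0.8, 0.6])
print(mean_squared_error(targets, predictions))  # 0.0625
```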